Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain
Authors
Abstract:
In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain, the latter defined by observing the output of a discrete stochastic channel whose input is the finite-state-space, homogeneous, stationary Markov chain. To this end, we first obtain the relative entropy between two finite subsequences of the above-mentioned chains, using the definition of relative entropy between two random variables; we then define the relative entropy rate between these stochastic processes and study its convergence.
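To make the quantities in the abstract concrete, here is a minimal numerical sketch (not taken from the paper; the transition and channel matrices are illustrative assumptions): a two-state stationary Markov chain X is passed through a discrete memoryless channel to give the hidden Markov chain Y, and the finite-horizon relative entropy D(p_n‖q_n)/n is computed by exact enumeration, where p_n is the law of (X_1,…,X_n) and q_n the law of (Y_1,…,Y_n).

```python
import itertools
import math
import numpy as np

# Hypothetical 2-state example; both matrices are illustrative choices.
P = np.array([[0.9, 0.1],      # Markov transition matrix P(X_{t+1}=j | X_t=i)
              [0.2, 0.8]])
C = np.array([[0.95, 0.05],    # channel / emission matrix P(Y_t=j | X_t=i)
              [0.05, 0.95]])

# Stationary distribution of P: the left eigenvector for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def markov_prob(seq):
    """Probability of a state sequence under the stationary Markov chain."""
    p = pi[seq[0]]
    for a, b in zip(seq, seq[1:]):
        p *= P[a, b]
    return p

def hmm_prob(seq):
    """Probability of an observation sequence under the hidden Markov
    chain, computed with the forward algorithm."""
    alpha = pi * C[:, seq[0]]
    for y in seq[1:]:
        alpha = (alpha @ P) * C[:, y]
    return alpha.sum()

def relative_entropy_rate(n):
    """D(p_n || q_n) / n, enumerating all length-n sequences exactly."""
    d = 0.0
    for seq in itertools.product(range(2), repeat=n):
        p, q = markov_prob(seq), hmm_prob(seq)
        if p > 0:
            d += p * math.log(p / q)
    return d / n

for n in (2, 4, 6, 8):
    print(n, relative_entropy_rate(n))
```

Enumeration is exponential in n, so this only illustrates the definition for short horizons; the paper's contribution concerns the limit of D(p_n‖q_n)/n as n grows.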
Related articles
ADK Entropy and ADK Entropy Rate in Irreducible-Aperiodic Markov Chain and Gaussian Processes
In this paper, the two-parameter ADK entropy, as a generalization of Rényi entropy, is considered and some of its properties are investigated. We will see that the ADK entropy for continuous random variables is invariant under a location transformation but not under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and chain rule of this ent...
Taylor Expansion for the Entropy Rate of Hidden Markov Chains
We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact value of the entropy rate is an open problem. We introduce some probability matrices based on the Markov chain's and channel's parameters. Then, we try to obtain an estimate ...
Markov Chain and Hidden Markov Model
Figure 1. Finite state machine for a Markov chain X0 → X1 → X2 → · · · → Xn, where the random variables Xi take values in I = {S1, S2, S3}. The numbers T(i, j) on the arrows are the transition probabilities, T(i, j) = P(Xt+1 = Sj | Xt = Si). Definition 1.2. We say that (Xn)n≥0 is a Markov chain with initial distribution λ and transition matrix T if (i) X0 has distribution λ; (ii) for ...
Relative entropy between Markov transition rate matrices
We derive the relative entropy between two Markov transition rate matrices from sample path considerations. This relative entropy is interpreted as a "level 2.5" large deviations action functional. That is, the level-two large deviations action functional for empirical distributions of continuous-time Markov chains can be derived from the relative entropy using the contraction mapping principle...
Inference of Markov Chain: A Review on Model Comparison, Bayesian Estimation and Rate of Entropy
This article has no abstract.
The Relative Entropy Rate For Two Hidden Markov Processes
The relative entropy rate is a natural and useful measure of distance between two stochastic processes. In this paper we study the relative entropy rate between two Hidden Markov Processes (HMPs), which is of both theoretical and practical importance. We give new results showing analyticity, representation using Lyapunov exponents, and Taylor expansion for the relative entropy rate of two discr...
Journal title
Volume 8, Issue 1
Pages 97–110
Publication date: 2011-09